
Conversation

@abhinavclemson (Collaborator)

Description

Add vLLM Support for GPT OSS and its mapping generator for tunix.
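For reviewers skimming the diff hunks below: a rough sketch of what a per-model weight-name mapping plus registry can look like. Only STANDALONE_VLLM_WEIGHT_MAPPING, to_hf_mapping, and num_hidden_layers appear in this PR; every other name, parameter path, and the layer count here is an illustrative assumption, not the actual implementation.

```python
# Illustrative sketch only -- not the code in this PR.
# STANDALONE_VLLM_WEIGHT_MAPPING and to_hf_mapping are names from the diff
# below; WeightMapping, the parameter paths, and the layer count are
# assumptions made for the example.
import dataclasses


@dataclasses.dataclass
class WeightMapping:
  """Maps Tunix/MaxText parameter paths to Hugging Face parameter names."""

  num_layers: int

  def to_hf_mapping(self):
    """Returns {tunix_param_path: hf_param_name} for every layer."""
    mapping = {}
    for layer in range(self.num_layers):
      mapping[f"decoder/layers_{layer}/mlp/wi_0"] = (
          f"model.layers.{layer}.mlp.gate_up_proj"
      )
    return mapping


# Hypothetical registry keyed by model name, mirroring the lookup in the diff.
STANDALONE_VLLM_WEIGHT_MAPPING = {
    "gpt-oss-20b": WeightMapping(num_layers=24),
}

print(len(STANDALONE_VLLM_WEIGHT_MAPPING["gpt-oss-20b"].to_hf_mapping()))
```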

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation (https://maxtext.readthedocs.io/en/latest/development.html#adding-new-documentation-files).

@richjames0 (Collaborator) left a comment


lgtm


# TODO: Enable multi-host sharding, if there is a mismatch in shapes.
# # MULTI-HOST case.
val = jax.device_put(val, current.sharding)

Was this tested at multi-host?

With what sharding do you think it might not work?
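For reference while this question is open, here is a single-host sketch of what the line under review does: jax.device_put places a value according to another array's sharding. The mesh axis name and shapes are assumptions, and this does not exercise the multi-host path being asked about.

```python
# Single-host sketch of resharding a value onto a target param's sharding.
# Axis name "data" and the shapes are illustrative assumptions.
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec

mesh = Mesh(np.array(jax.devices()), axis_names=("data",))
target_sharding = NamedSharding(mesh, PartitionSpec("data"))

n = jax.device_count()
current = jax.device_put(jnp.zeros((8 * n, 4)), target_sharding)  # stand-in target param
val = jnp.ones((8 * n, 4))                                        # stand-in checkpoint value

# Mirrors the line under discussion: place `val` according to the target's sharding.
val = jax.device_put(val, current.sharding)
print(val.sharding)
```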

return current.at[..., 0::2].set(val)

def fuse_interleaved_up(val, tgt_param):
"""Fuse Up (wi_1) with Multi-Host Sharding Support."""

Why is multi-host special? Do you mean multi-device or multi-host?

From JAX's perspective, I think only the number of devices (and the sharding) matters.
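Setting the sharding question aside, the interleaving itself is straightforward; a minimal single-device sketch of fusing gate (wi_0) into the even columns and up (wi_1) into the odd columns, with illustrative shapes:

```python
# Minimal sketch of the interleaved gate/up fusion, ignoring sharding entirely.
# Shapes are illustrative assumptions.
import jax.numpy as jnp

hidden, intermediate = 4, 6
gate = jnp.ones((hidden, intermediate))       # stands in for wi_0
up = jnp.full((hidden, intermediate), 2.0)    # stands in for wi_1

fused = jnp.zeros((hidden, 2 * intermediate))
fused = fused.at[..., 0::2].set(gate)         # even columns <- gate
fused = fused.at[..., 1::2].set(up)           # odd columns  <- up

print(fused[0, :4])  # alternating 1.0, 2.0, 1.0, 2.0
```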

return STANDALONE_VLLM_WEIGHT_MAPPING[self.model_name].to_hf_mapping()
mapping_fn = STANDALONE_VLLM_WEIGHT_MAPPING[self.model_name].to_hf_mapping
total_num_layers = self.config["num_hidden_layers"]
print(f"total_num_layers: {total_num_layers} for model: {self.model_name}")

Could you remove this print or change it to debug logging (e.g. logging.debug)?
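A sketch of the suggested change using the standard-library logging module (the project may prefer its own logging helper; the wrapper function here is hypothetical):

```python
# Sketch of demoting the print to debug-level logging.
import logging

logger = logging.getLogger(__name__)


def log_layer_count(config: dict, model_name: str) -> int:
  """Hypothetical helper mirroring the two lines under review."""
  total_num_layers = config["num_hidden_layers"]
  # Lazy %-formatting: the message is only built if DEBUG is enabled.
  logger.debug("total_num_layers: %d for model: %s", total_num_layers, model_name)
  return total_num_layers


if __name__ == "__main__":
  logging.basicConfig(level=logging.DEBUG)
  log_layer_count({"num_hidden_layers": 24}, "gpt-oss-20b")
```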

# TODO: Enable multi-host sharding, if there is a mismatch in shapes.
# # MULTI-HOST case.
val = jax.device_put(val, current.sharding)
val.block_until_ready()

is this needed?
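For context on the question: jax.device_put dispatches asynchronously, and block_until_ready only waits for the transfer to finish, which typically matters for timing or for bounding host memory while loading many weights, not for the correctness of later ops. A small sketch (shapes assumed):

```python
# Sketch: device_put returns before the copy finishes; block_until_ready waits.
import time
import jax
import jax.numpy as jnp

val = jnp.ones((1024, 1024))

start = time.perf_counter()
val = jax.device_put(val, jax.devices()[0])  # asynchronous dispatch
dispatch_s = time.perf_counter() - start

val.block_until_ready()                      # waits for the transfer to complete
total_s = time.perf_counter() - start

print(f"dispatch: {dispatch_s:.6f}s, with transfer: {total_s:.6f}s")
```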
